
Matt Cutts on Google Webmaster Tools (Belated)

Here’s the last belated Matt Cutts video transcript. This one was posted on Matt’s blog on August 8th (and there’s a shout out to yours truly!). It exclusively has to do with the launch of Google Webmaster Central, which Rand covered in a previous blog post. Nonetheless, here’s the transcript:

Google Webmaster Tools

This is Matt Cutts. It’s Monday, August 7th, and it’s going to be the first day of SES. I’ve been picking SEOs’ brains since Saturday, so I’m already starting to lose my voice a little bit. But, I wanted to alert you to some stuff that people might have missed that just happened this past Friday. I think it got missed a little bit partly because it happened at about 9:00 on a Friday, and partly because a large fraction of the A-list, B-list, and C-list bloggers about search were all on their way to, or arriving at, SES San Jose. [In SEOmoz’s case, we were having a little soiree at our office that evening.] Google’s actually done quite a bit lately to revamp the amount of information we provide to general users and to webmasters.

So, one thing is Google.com/support has been beefed up a whole lot. So, for all the different support stuff, there are a lot more answers with a lot more fresh information. It’s pretty cool. If you go to Google.com/support, that’s sort of the one-stop shop for all sorts of general user support needs. However, if you found your way to this video [forcefully shoved by my boss is more like it], you’re probably not just a regular user. You’re probably also a webmaster [or SEOmoz slave; whatever].

If you’re a webmaster, there’s a tool you need to know about which used to be called Sitemaps until Friday. It all started out sometime last year when this tool called Sitemaps let people submit all the URLs that were on their sites. They could even say things like when they had last changed, which URLs were more important, all sorts of stuff. And a lot of people made tools to create those Sitemaps files, and that was fantastic. The thing that happened after that is the Sitemaps team decided to build a more general console, something that could help webmasters with all sorts of other problems.
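For reference, a Sitemaps file is just an XML list of your URLs with optional metadata; a minimal sketch (the URLs and dates are placeholders, not from the video) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-08-07</lastmod>      <!-- when the page last changed -->
    <changefreq>weekly</changefreq>    <!-- a hint about how often it changes -->
    <priority>1.0</priority>           <!-- relative importance within your own site -->
  </url>
  <url>
    <loc>http://www.example.com/blog/</loc>
    <priority>0.5</priority>
  </url>
</urlset>
```

Only `<loc>` is required per URL; the rest are the optional hints Matt mentions about when pages last changed and which are more important.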

So, that tool has been called Sitemaps, but I know that Adam Lasnik came back from SES London, and he said that when he talked about Sitemaps, everybody thought, “Oh, XML files,” and stuff like that. So, just this last week, Sitemaps changed its name. There’s now an official area called Google Webmaster Central. And if you go to that, it’s just Google.com/webmaster, or webmasters (I’ll make sure they both work), and you’ll get a set of lots of different tools.

There’s now an official Google Webmaster blog, which is mostly going to be maintained by Vanessa Fox. I’m sure I’ll stop by from time to time to weigh in on various things. That used to be the Sitemaps blog, and the scope of it is broadening to now include anything related to webmasters, which I think is fantastic.

The other thing is the Sitemaps tool has now become the Google Webmaster tools, and it’s got all sorts of stuff. It’s not just a place where you can say, “Here are all the URLs I’ve got. Google, please come crawl these URLs.” Just off the top of my head, it’s got a robots.txt checker, and it’s got things to show you what crawl errors it’s seen on your URLs. Earlier today, in fact, I found where I had made a link without the http, and that doesn’t work so well in WordPress. So, I had gotten some 404 errors whenever Google tried to crawl. I was actually able to fix a broken link by looking at that table.
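For anyone who hasn’t written one, the robots.txt file the checker validates is just a plain-text file at your site’s root. A small illustrative example (the paths and hostname are made up):

```text
# robots.txt at http://www.example.com/robots.txt
User-agent: Googlebot
Disallow: /private/     # keep Googlebot out of this directory

User-agent: *
Disallow: /tmp/         # applies to all other crawlers
```

The checker is handy because a typo here (a missing slash, a bad `User-agent` line) can silently block crawling of pages you actually want indexed.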

In some cases, we can tell whether you have spam penalties or not. If you have hidden text, or something like that, we can actually show you that you have a penalty and let you file a reinclusion request, which we can then give a little more weight to because we know it’s you; you’ve verified and proved that you really own that site. They also did a new release on Friday along with changing the name, and they introduced a lot of pretty neat little stuff, things like “Show me all the query words that show up in each subdirectory,” or “Show me the crawl errors in each subdirectory,” things like that.

However, the biggest thing that I’m really happy about is something called the preferred domain. Sometimes we see that people’s links aren’t uniform. Maybe they don’t have all their ducks in a row, so some of their links point to www.mattcutts.com and some point to just mattcutts.com, with the www or without the www. If some people outside your site, like the ODP or whatever, link to one, and other people link to the other, Google tries to disambiguate that. It tries to figure out, “Oh, www and non-www are actually the same page, and they’re always going to be the same site.” But we can’t always get that 100% correct.

This new feature in Sitemaps, the Google Webmaster console, or Google Webmaster tools, whatever you want to call it, now lets you say, “Okay, I verify I own this domain, and I verify I own it with the www as well. Now, treat those the same.” Now, bear in mind it’s a preference, so the first thing is it might take several weeks to go into effect. The next thing is, because it’s a preference, we don’t 100% guarantee that if you say, “I want www,” we’ll always go that way. But, in the normal, typical situation, within a few weeks you should see your URLs change from being split between www and non-www, if you have this issue, to all being on whichever one you prefer.
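The preferred-domain setting only tells Google which version you want; many webmasters also enforce it at the server with a permanent (301) redirect so every visitor and crawler lands on one hostname. A minimal sketch for Apache with mod_rewrite (example.com is a placeholder; this is a common complementary fix, not something from the video):

```text
# .htaccess — permanently redirect non-www requests to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The 301 status tells crawlers the move is permanent, which helps consolidate links pointing at both versions onto the one you prefer.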

I volunteered my domain to be used as a guinea pig by the crawl guys, so they were whipping it back and forth from www to non www, and things looked like they were working pretty well. Props to Dayo_UK, who asked for this feature, and a bunch of other people who asked for this feature. I’m glad we’re getting around to it. I’m sure we’ll continue to keep looking for ways that we can take requests from webmasters and try to turn that into useful information that they can get.

If you haven’t taken a fresh look at the Google Webmaster tools, I would highly recommend that you do that. It’s worth your time: you can find all kinds of errors, you can test out robots.txt, you can sometimes see penalties, there are words that you rank for, and words that you rank for and get clicked on a lot. And, most importantly, there’s this www and non-www setting, so if you’ve been affected by that, you can now tell Google which way you want it to be. The Sitemaps team has been doing a great job. I’m sure I’ll continue to call them Sitemaps for a while, not being able to get used to the name change, but I’ll get used to it eventually. I hope that you give it a try because I think it can be useful for anybody who’s got a site.
